Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning
Authors
Abstract
Very deep convolutional networks have been central to the largest advances in image recognition performance in recent years. One example is the Inception architecture that has been shown to achieve very good performance at relatively low computational cost. Recently, the introduction of residual connections in conjunction with a more traditional architecture has yielded state-of-the-art performance in the 2015 ILSVRC challenge; its performance was similar to the latest generation Inception-v3 network. This raises the question: Are there any benefits to combining Inception architectures with residual connections? Here we give clear empirical evidence that training with residual connections accelerates the training of Inception networks significantly. There is also some evidence of residual Inception networks outperforming similarly expensive Inception networks without residual connections by a thin margin. We also present several new streamlined architectures for both residual and nonresidual Inception networks. These variations improve the single-frame recognition performance on the ILSVRC 2012 classification task significantly. We further demonstrate how proper activation scaling stabilizes the training of very wide residual Inception networks. With an ensemble of three residual and one Inception-v4 networks, we achieve 3.08% top-5 error on the test set of the ImageNet classification (CLS) challenge.
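The abstract notes that proper activation scaling stabilizes the training of very wide residual Inception networks: the output of the residual (Inception) branch is scaled down by a small constant before being added back to the shortcut. The sketch below illustrates that idea in plain NumPy; `branch` is a hypothetical stand-in for an Inception-style convolutional block, and the scale value of 0.1 is one point in the small range the approach uses, not a prescribed constant.

```python
import numpy as np

def scaled_residual_block(x, branch, scale=0.1):
    """Shortcut connection with a scaled residual branch.

    Scaling the residual activations by a small factor before the
    addition is the stabilization trick described in the abstract;
    `branch` here is a placeholder for an Inception-style block.
    """
    return x + scale * branch(x)

# Toy usage: a dummy branch that doubles its input, so the block
# computes x + 0.1 * (2 * x) = 1.2 * x.
x = np.ones((2, 4))
out = scaled_residual_block(x, lambda t: 2.0 * t, scale=0.1)
```

Without the scaling factor (i.e., `scale=1.0`) very wide residual variants can become unstable early in training, which is the failure mode the scaling is meant to prevent.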
Similar resources
Improved Inception-Residual Convolutional Neural Network for Object Recognition
Machine learning and computer vision have driven many of the greatest advances in the modeling of Deep Convolutional Neural Networks (DCNNs). Nowadays, most research has focused on improving recognition accuracy with better DCNN models and learning approaches. The recurrent convolutional approach is not applied very much, other than in a few DCNN architectures. On the other hand, In...
A Deep Residual Inception Network for HEp-2 Cell Classification
Indirect immunofluorescence (IIF) of Human Epithelial-2 (HEp-2) cells is a commonly used method for the diagnosis of autoimmune diseases. The traditional approach relies on specialists observing HEp-2 slides under a fluorescence microscope, which suffers from a number of shortcomings, such as being subjective and labor-intensive. In this paper, we propose a hybrid deep learning network combining the ...
Deep Learning Based Food Recognition
Food is a cornerstone of people's lives. Nowadays more and more people care about their dietary intake, since an unhealthy diet leads to numerous diseases, such as obesity and diabetes. Accurately labelling food items is essential to keeping fit and living a healthy life. However, currently referring to nutrition experts or Amazon Mechanical Turk is the only way to recognize the food items. I...
A hybrid EEG-based emotion recognition approach using Wavelet Convolutional Neural Networks (WCNN) and support vector machine
Nowadays, deep learning and convolutional neural networks (CNNs) have become widespread tools in many biomedical engineering studies. A CNN is an end-to-end tool that makes the processing procedure integrated, but in some situations this processing tool needs to be fused with machine learning methods to be more accurate. In this paper, a hybrid approach based on deep features extracted from Wave...
Audio Bird Classification with Inception-v4 extended with Time and Time-Frequency Attention Mechanisms
We present an adaptation of the deep convolutional network Inception-v4 tailored to solving bioacoustic classification problems. Bird sound classification was treated as if it were an image classification problem via transfer learning of Inception. Inception, the state of the art in image classification, was used together with an attention algorithm applied to (multiscale) time-frequency representati...
Publication date: 2017